9 research outputs found

    Hybrid fastslam approach using genetic algorithm and particle swarm optimization for robotic path planning

    Simultaneous Localization and Mapping (SLAM) is an algorithmic technique used by mobile robots to build a relative map of an unknown environment. FastSLAM is a SLAM algorithm capable of speeding up convergence in the robot's path planning and environment map estimation, and it is popular for its higher accuracy compared to other SLAM algorithms. However, FastSLAM suffers from inconsistent results due to the particle depletion problem over time. This study aims to minimize this inconsistency using two soft computing techniques: particle swarm optimization (PSO) and genetic algorithm (GA). To achieve this goal, a new hybrid approach based on these techniques is developed and integrated into FastSLAM to improve its consistency. GA is used to optimize particle weights, while PSO is used to optimize the robot's estimates when generating the environment map, minimizing particle depletion in FastSLAM. The performance of the proposed hybrid approach is evaluated using root mean square error (RMSE) analysis to measure the estimation error for robot and landmark positions, and the results are verified using margin-of-error analysis. Based on the percentage error analysis, the new hybrid approach mitigates the problems in FastSLAM and reduces the errors by up to 33.373% for the robot position and 27.482% for the landmark set position
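    The RMSE analysis used to score the robot and landmark position estimates can be sketched as follows; this is a generic 2-D position RMSE, not code from the paper:

```python
import math

def rmse(estimates, ground_truth):
    """Root mean square error between estimated and true 2-D positions."""
    assert len(estimates) == len(ground_truth) and estimates
    total = sum((ex - gx) ** 2 + (ey - gy) ** 2
                for (ex, ey), (gx, gy) in zip(estimates, ground_truth))
    return math.sqrt(total / len(estimates))
```

    The same function applies to both the robot trajectory and the landmark set: lower RMSE means less drift from the particle depletion problem.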

    Hybrid dragonfly algorithm with neighbourhood component analysis and gradient tree boosting for crime rates modelling

    In crime studies, time series prediction of crime rates helps in strategic crime prevention formulation and decision making. Statistical models are commonly applied to predict crime rate time series. However, crime rate data are limited and mostly nonlinear, while statistical models are mainly linear and can only capture linear relationships. Thus, this study proposed a time series crime prediction model that can handle nonlinear components as well as limited historical crime rate data. Recently, Artificial Intelligence (AI) models have been favoured because they can handle nonlinear components and are robust to small samples. Hence, the proposed crime model implemented an AI model, namely Gradient Tree Boosting (GTB), to model the crime rates. The crime rates are modelled using United States (US) annual crime rates for eight crime types, with nine factors that influence the crime rates. Since GTB has no built-in feature selection, this study proposed a hybridisation of Neighbourhood Component Analysis (NCA) and GTB (NCA-GTB) to identify the significant factors that influence the crime rates. It was also found that both NCA and GTB are sensitive to their input parameters. Thus, the DA2-NCA-eGTB model was proposed to improve the NCA-GTB model: it hybridises a metaheuristic optimisation algorithm, namely the Dragonfly Algorithm (DA), with the NCA-GTB model to optimise the NCA and GTB parameters. In addition, the DA2-NCA-eGTB model improves the accuracy of NCA-GTB by using Least Absolute Deviation (LAD) as the GTB loss function. The experimental results showed that the DA2-NCA-eGTB model outperformed existing AI models in all eight modelled crime types, as shown by smaller Mean Absolute Percentage Error (MAPE) values, between 2.9195 and 18.7471. In conclusion, the study showed that the DA2-NCA-eGTB model is statistically significant in representing all crime types and handles the nonlinear component in limited crime rate data well
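    The MAPE values reported above (2.9195 to 18.7471) follow the standard definition, which can be sketched as:

```python
def mape(actual, predicted):
    """Mean absolute percentage error, expressed in percent.

    Assumes no actual value is zero (division by the actual value)."""
    assert len(actual) == len(predicted) and all(a != 0 for a in actual)
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)
```

    Because MAPE is scale-free, it allows the eight crime types, which have very different absolute rates, to be compared on one metric.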

    Hybrid Neighbourhood Component Analysis with Gradient Tree Boosting for Feature Selection in Forecasting Crime Rate

    Crime forecasting is beneficial as it provides valuable information to governments and authorities in planning efficient crime prevention measures. Most criminology studies have found that influences from several factors, such as social, demographic, and economic factors, significantly affect crime occurrence. Therefore, criminology experts and researchers study and observe the effect of these factors on criminal activities, as this provides relevant insight into possible future crime trends. Based on the literature review, proper analyses for identifying the significant factors that influence crime are scarce and limited. Therefore, this study proposed a hybrid model that integrates Neighbourhood Component Analysis (NCA) with Gradient Tree Boosting (GTB) to model United States (US) crime rate data. NCA is the feature selection technique used in this study to identify the significant factors influencing the crime rate. Once the significant factors were identified, an artificial intelligence technique, i.e., GTB, was implemented to model the crime data and predict the crime rate. The performance of the proposed model was compared with existing models using quantitative measurement error analysis. Based on the results, the proposed NCA-GTB model outperformed the other crime models in predicting the crime rate, producing the smallest quantitative measurement error in the case study
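    The two-stage idea (score features, keep the most informative, then fit the regressor) can be sketched as below; note that a simple absolute-correlation score stands in for NCA's learned feature weights here, purely for illustration:

```python
import math
import statistics

def select_top_features(X, y, k):
    """Rank the columns of X by |Pearson correlation| with y and keep
    the indices of the top k.  A stand-in for NCA feature weights."""
    def corr(col):
        xs = [row[col] for row in X]
        mx, my = statistics.mean(xs), statistics.mean(y)
        sx = math.sqrt(sum((v - mx) ** 2 for v in xs))
        sy = math.sqrt(sum((v - my) ** 2 for v in y))
        if sx == 0 or sy == 0:
            return 0.0
        return sum((a - mx) * (b - my) for a, b in zip(xs, y)) / (sx * sy)

    scores = sorted(((abs(corr(j)), j) for j in range(len(X[0]))),
                    reverse=True)
    return [j for _, j in scores[:k]]
```

    The selected column indices would then be used to subset the factor matrix before fitting GTB.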

    A proposed gradient tree boosting with different loss function in crime forecasting and analysis

    No full text
    Gradient tree boosting (GTB) is a newly emerging artificial intelligence technique in crime forecasting. GTB is a stage-wise additive framework that adopts numerical optimisation methods to minimise the loss function of the predictive model, which in turn enhances its predictive capabilities. The applied loss function plays a critical role in determining GTB's predictive capability and performance. GTB uses the least squares function as its standard loss function. Motivated by this limitation, this study was conducted to identify a potential replacement for the current loss function in GTB by applying different existing standard mathematical functions. Crime models are developed based on GTB with different loss functions, and their forecasting performance is compared. In this case study, it is found that among the tested loss functions, the least absolute deviation function outperforms the others, including GTB's standard least squares loss function, in all developed crime models
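    A sketch of why the loss choice matters: at each boosting stage, GTB fits a regression tree to the pseudo-residuals, i.e. the negative gradient of the loss. Least squares yields the raw residual, while least absolute deviation yields only its sign, which is what makes it robust to outliers:

```python
def pseudo_residuals(y_true, y_pred, loss="least_squares"):
    """Negative gradient of the loss w.r.t. the current predictions --
    the target each GTB stage's regression tree is fitted to."""
    if loss == "least_squares":   # L(y, f) = (y - f)^2 / 2  ->  r = y - f
        return [y - f for y, f in zip(y_true, y_pred)]
    if loss == "least_absolute":  # L(y, f) = |y - f|  ->  r = sign(y - f)
        return [(y > f) - (y < f) for y, f in zip(y_true, y_pred)]
    raise ValueError(f"unknown loss: {loss}")
```

    With least absolute deviation, a single extreme crime-rate observation contributes the same magnitude to the gradient as any other point, so one outlier year cannot dominate a boosting stage.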

    Comparative study on artificial intelligence techniques in crime forecasting

    No full text
    Efficient crime analysis is beneficial and helpful for understanding the behaviour, trends, and patterns of crime. Crime forecasting is an area of research that assists authorities in enforcing early crime prevention measures. Statistical techniques have been widely applied in the past to develop crime forecasting models. However, researchers have begun to shift their interest from statistical models to artificial intelligence models in crime forecasting. Thus, this study was conducted to observe the capability of artificial intelligence techniques to improve crime forecasting. The main objective is a comparative analysis of the forecasting performance of four artificial intelligence techniques, namely artificial neural network (ANN), support vector regression (SVR), random forest (RF), and gradient tree boosting (GTB), in forecasting the crime rate. The forecasting capability of each technique was assessed in terms of measurement errors. From the results obtained, GTB showed the highest performance capability, scoring the lowest measurement errors compared to SVR, RF, and ANN
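    The comparison itself reduces to ranking the techniques by their error scores; a trivial sketch with hypothetical error values (not the study's actual numbers):

```python
def rank_by_error(errors):
    """Rank techniques from best (lowest error) to worst."""
    return sorted(errors, key=errors.get)
```

    Applied to, say, `{"GTB": 4.1, "RF": 5.2, "SVR": 6.3, "ANN": 7.8}`, this reproduces the ordering reported in the study, with GTB first.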

    GA-PSO-FASTSLAM: A hybrid optimization approach in improving fastSLAM performance

    No full text
    The FastSLAM algorithm is one of the Simultaneous Localization and Mapping (SLAM) algorithms introduced for autonomous mobile robots. It decomposes the SLAM problem into one distinct localization problem and a collection of landmark estimation problems. FastSLAM suffers from a particle depletion problem, which causes its accuracy to degenerate over time. In this work, a new hybrid approach is proposed that integrates two soft computing techniques, genetic algorithm (GA) and particle swarm optimization (PSO), into FastSLAM. It is developed to overcome the particle depletion problem by improving FastSLAM's accuracy in estimating the robot and landmark set positions. The experiment is conducted in simulation, and the results are evaluated using root mean square error (RMSE) analysis. The experimental results show that the proposed hybrid approach is able to mitigate the FastSLAM problem by reducing the error (RMSE) in robot and landmark set position estimation
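    Particle depletion is commonly diagnosed via the effective sample size of the normalised particle weights; this criterion is standard particle-filter practice rather than something specific to this paper:

```python
def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2) over normalised weights.  A value near the
    particle count means healthy diversity; a value near 1 means the
    weight has collapsed onto a few particles (depletion)."""
    total = sum(weights)
    norm = [w / total for w in weights]
    return 1.0 / sum(w * w for w in norm)
```

    A filter typically resamples (or, as here, would invoke the GA/PSO correction) when N_eff drops below some fraction of the particle count.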

    Transformer in mRNA Degradation Prediction

    No full text
    The unstable properties and the advantages of mRNA vaccines have encouraged many experts worldwide to tackle the degradation problem. Machine learning models have been widely implemented in bioinformatics and healthcare to gain insights from biological data. Thus, machine learning plays an important role in predicting the degradation rate of mRNA vaccine candidates. Stanford University held the OpenVaccine Challenge competition on Kaggle to gather top solutions to this problem, with the mean column-wise root mean square error (MCRMSE) used as the main performance metric. The Nucleic Transformer has been proposed by other researchers as a deep learning solution that utilises a self-attention mechanism and a Convolutional Neural Network (CNN). This paper enhances the existing Nucleic Transformer by using the AdaBelief or RangerAdaBelief optimizer together with a proposed decoder that consists of a normalization layer between two linear layers. Based on the experimental results, the enhanced Nucleic Transformer outperforms the existing solution. In this study, the AdaBelief optimizer performed better than the RangerAdaBelief optimizer, even though the latter possesses Ranger's advantages. The advantages of the proposed decoder appear only when data is limited; when data is sufficient, performance is similar, and it remains better than the linear decoder only when the AdaBelief optimizer is used. As a result, the combination of the AdaBelief optimizer with the proposed decoder performs best, with 2.79% and 1.38% improvements in public and private MCRMSE, respectively
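    The MCRMSE metric used in the OpenVaccine Challenge is the RMSE computed per scored target column (e.g. reactivity, degradation rates) and then averaged; a minimal sketch:

```python
import math

def mcrmse(y_true_cols, y_pred_cols):
    """Mean column-wise RMSE: average the RMSE over the target columns."""
    rmses = []
    for t_col, p_col in zip(y_true_cols, y_pred_cols):
        mse = sum((t - p) ** 2 for t, p in zip(t_col, p_col)) / len(t_col)
        rmses.append(math.sqrt(mse))
    return sum(rmses) / len(rmses)
```

    Averaging per-column RMSEs (rather than pooling all targets into one RMSE) weights each degradation target equally regardless of its scale.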